On January 6th, 2021, a mob of Donald Trump’s supporters stormed the United States Capitol in Washington, DC. One rioter was killed during the clash between protesters and police, and eight more people died in the weeks that followed. The protesters had been led to believe that the 2020 Presidential Election was stolen from Donald Trump, and that it was their patriotic duty to stop Congress from certifying what they saw as illegitimate electoral votes for Joe Biden (Bodner). Who is responsible for propagating such obvious untruths? Fingers in the mainstream media point toward Donald Trump and his political allies, but also toward QAnon: a dense, complex web of conspiracy theories ranging from Holocaust denial to accusations of Satanic worship. Paranoia and conspiratorial thinking have played a role in American politics since the country’s founding (Hofstadter). But no single conspiracy theory in recent memory has produced a catastrophe on the scale of the 2021 United States Capitol attack. Two prominent freshman congresswomen are proponents of the QAnon conspiracy theory. How has QAnon come to play such a significant role in contemporary American politics?
Introduction and Methodology
The QAnon conspiracy theory began as a series of anonymous posts on the imageboard website 8chan. The author, known only as “Q”, claimed to be a member of the Trump administration with “Q-level” security clearance. The cryptic posts alleged that the Trump administration was at war with the “Deep State,” a secret network of political enemies embedded within America’s bureaucracy, and promised a day of reckoning in which these enemies would be arrested or executed (Bodner). While the term originally referred to the hidden networks of power within Turkish politics, the Deep State of QAnon is defined more loosely as anyone who intends to undermine Trump’s agenda. While the posts make clear that it is not made up entirely of Democratic politicians (most likely to protect the theory from accusations of pure partisanship), the names most frequently associated with the Deep State are Hillary Clinton, Barack Obama, John Podesta, and Nancy Pelosi (Bodner). The posts started out relatively tame: while not grounded in any observable truth, the accusations were not wildly outrageous. This makes sense, since 8chan’s user base at the time was made up mostly of internet-savvy young people. But QAnon has a unique ability to envelop seemingly unrelated conspiracies. Early in its development, QAnon incorporated into its mythos the Pizzagate theory, which had emerged after the leak of John Podesta’s private emails. Pizzagate alleged that higher-ups in the Democratic Party were using Comet Ping Pong, a pizza restaurant in Washington, D.C., as headquarters for the trafficking of children. This led to theories about Satanism, adrenochrome, spirit cooking, Jeffrey Epstein, and more (Bodner). Since Pizzagate, QAnon has absorbed nearly every other right-wing conspiracy theory into its canon to some extent. From vaccine misinformation to Holocaust denial to Flat Earth, no theory is safe from its reach (Packer). The most apparent trend, though, is QAnon’s evolution from a relatively secular conspiracy theory to a decidedly Christian one. While the original Q posts make little reference to Christianity (Hannah), posts from Q’s followers, called “Anons,” contain overt religious imagery and reference evangelical Christian tropes such as the Mark of the Beast or the Last Judgment (Bodner). QAnon’s most fervent supporters are evangelical Christians. At the “God and Country Patriot Roundup,” a QAnon convention in Dallas, Texas, Andrew Callaghan of the YouTube channel “Channel 5” spoke to Judy, a mother whose support for QAnon has estranged her from her family: “It’s spiritual warfare. This is between good and evil” (Callaghan). How did this radical shift toward evangelicalism happen?
Scholarship tends to agree that QAnon surged in popularity and entered the mainstream once it was introduced to evangelical Christian baby boomers on Facebook (Bodner). This moment is the focus of my analysis. Why was QAnon so popular among evangelical Christian social media users? What role did platforms play in QAnon’s rapid growth? In analyzing the proliferation of the QAnon conspiracy theory on social media through its popularity among evangelical Christian users, I divide my study into two areas of focus: (1) the evangelical Christian social media users themselves and (2) the mainstream social media platforms they use. In analyzing the users, I will focus on three overlapping concepts: affective publics, charismatic authorities, and textual interpretation. I will draw parallels between the evangelical Christian model of social media and the QAnon influencer economy, and I will explain the specific features of the QAnon conspiracy theory that allowed it to be so easily embraced by evangelical Christian social media users. In analyzing the role social media platforms have played in the spread of the conspiracy theory, I will focus on three more overlapping concepts: the attention economy, content regulation, and platform affordances. I will explain how and why platforms failed to curb QAnon’s growth, the tactics QAnon followers used to circumvent content regulation, and how platforms’ digital infrastructure encouraged the conspiracy’s propagation. These two areas and six concepts should not be understood as distinct from one another. It is only through their interaction that the unique QAnon phenomenon was created.
QAnon and Evangelical Christian Social Media Users
In “#AmplifyWomen: the emergence of an evangelical feminist public on social media,” Corrina Laughlin explores how evangelical Christian women use feminist rhetoric to build followings on social media. One significant component of her analysis is the concept of “affective publics.” Affective publics emerge when groups of people on social media engage personally with political ideas. Rather than simply receiving a message from an institution like a television network, social media users interact with one another on a personal level. This personal interaction creates affect, or emotion, which snowballs in the context of political activism. These affective interactions drive cyclical engagement with content on social media or materialize in activism offline. “Users begin to feel part of something larger, even when that thing is amorphous and mediated; in turn they perform a version of themselves and these personal declarations of the self also contain collectivist or civic aspirations” (Laughlin). In the context of Laughlin’s piece, women who felt sidelined in their churches or families were able to interact with other like-minded evangelical Christian women on social media. They talked about their frustrations with the church and built their own postfeminist evangelical Christian online spaces. Though its offline effects are starkly different, QAnon provides evangelical Christian social media users with a similar outlet for activism. QAnon posts invoke intense moral urgency: the fight against the Deep State is a matter of “life and death,” and followers are referred to as “digital soldiers” (Callaghan). It is easy to see how participating in QAnon might give evangelical Christians, perhaps alienated from their churches, families, or communities, the opportunity to materially engage with the ideas their churches espouse. If QAnon’s fight against the Deep State really is a war between good and evil, then it is any good Christian’s moral responsibility to participate. Will Sommer of the Daily Beast summarizes QAnon’s emotional appeal: “The most powerful people in the world are committing these heinous crimes with impunity, and only you and Donald Trump can stop it… It preys on your emotions” (Callaghan). QAnon’s networked affect materializes in cyclical engagement that further fuels the theory’s growth, and it catalyzes offline events such as the Comet Ping Pong shooting and the January 6th attack on the Capitol.
In “#AmplifyWomen,” Laughlin discusses how evangelical Christian women use their identities and postfeminist rhetoric to build social media followings, eventually gaining an outsized influence within evangelical culture at large. These influencers become “charismatic authorities,” who “rely on devotion to the exceptional sanctity, heroism or exemplary character of an individual person, and of the normative patterns or order revealed or ordained by him” (Laughlin). In other words, charismatic authorities use social media to perform as idealized versions of evangelical Christian women. They gain followers, claim authority, and leverage that authority to influence evangelical culture. QAnon’s influencer economy works in much the same way. The original 8chan posts are far too obscure and esoteric to grow a mass movement. QAnon was only able to reach its massive audience through the creation of its own content-creating influencers, who grew their followings by contributing to the theory’s sprawling lore (Packer). QAnon influencers leverage their own identifying characteristics to provide distinct perspectives on the conspiracy theory. Comedian and author Evan Sayet, a guest speaker at the “God and Country Patriot Roundup,” uses his Jewish faith to rebut the accusations of antisemitism leveled at QAnon and to draw parallels between the treatment of Jews during the Holocaust and the censorship of right-wing social media users. Bodybuilder and QAnon podcaster Chris Eryx uses his expertise in health and fitness to dissuade his audience from wearing masks or taking the COVID-19 vaccine (Callaghan). By manipulating the tropes of existing online communities, QAnon influencers are able to convert social media users who might not otherwise be open to conspiratorial thinking. Using Scripture or evangelical Christian themes such as the Mark of the Beast or the Last Judgment allows QAnon influencers to reach ordinary evangelical Americans, a demographic already susceptible to QAnon because of its broad support for Donald Trump.
In “#AmplifyWomen,” Laughlin explores how evangelical Christianity’s lack of a central authority leaves the faith open to interpretation. Evangelical pastors build their congregations through unique interpretations of Christianity: some emphasize the prosperity gospel, while others are open to feminism and LGBTQIA+ inclusion. These differences in perspective invite debate and discussion in online spaces. Like evangelical Christianity, QAnon’s belief system has no central authority or established canon (Bodner). Not only are the original 8chan posts unmoored from any verifiable fact; their author is unknown. Linguistic analysis suggests that many different people may have authored the posts (Hannah). And while the original 8chan posts laid the foundation for the theory, QAnon’s rapid growth meant its lore quickly grew far removed from those posts anyway. Most of the theory’s extensive mythos is driven by secondary sources: its followers and influencers. The movement has remained strong despite there being no new posts from Q since December 2020. As Mickey, a U.S. veteran and QAnon conference attendee, put it, “Q is the truth forever, whether there’s posts or not, as long as Anons never cease to find and spread the truth” (Callaghan).
Rather than undermining the conspiracy theory’s legitimacy, QAnon’s lack of a central authority or established canon is a major part of what makes it so successful. Because QAnon is not bound by any consistent logic, it can absorb any other tangentially related theory without contradiction. Accusations of Satanism or child sex trafficking, anti-vaccine misinformation or COVID-19 denial, antisemitism or Holocaust revisionism: all of these align to some extent with the broad theory that a nefarious Deep State controls America and is undermining Trump’s agenda. Charismatic authorities build brands on their interpretations of QAnon, incorporating unique combinations of other conspiracy theories into their content. This variability among different sects of QAnon followers drives debate and discussion among social media users, which means more content and more engagement, and contributes further to the conspiracy theory’s growth online.
The Role of Platforms
Social media platforms are free to use but make money by selling targeted advertising built on their users’ data. Over the last decade, recommendation algorithms have come to determine most of the content that users see on social media sites. Algorithms “learn” about users through metrics such as likes, follows, and watch time, and serve them specifically tailored content. The more users interact and engage, the better these algorithms get at recommending content. As algorithms get to know users more intimately, content recommendations become increasingly fringe and, oftentimes, extremist. As Tim Wu explains in “Attention Brokers,” platforms are in constant competition for users’ attention, because attention translates directly into advertising revenue (Wu). Controversial political content does extremely well on social media: whether users are hate-watching or actively enjoying it, extremist content invites engagement. Platforms, intentionally or not, push people toward extreme content in order to gauge their reactions and better serve them advertisements (Ghosh). It is built into the business model. Facebook, Twitter, and Reddit profit from QAnon and other inflammatory conspiracy theories.
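The feedback loop that Wu and Ghosh describe can be made concrete with a toy example. The Python sketch below is a deliberately simplified, hypothetical illustration of engagement-weighted ranking; no real platform’s recommendation system is this crude, and the posts, weights, and scoring function are invented for the example. The point is only that content which already provokes reactions is scored higher and shown to more users, who then react to it in turn.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int = 0
    shares: int = 0
    comments: int = 0

def engagement_score(post: Post) -> float:
    # Weight "high-arousal" signals (shares, comments) above passive likes,
    # since they predict further interaction. The weights are invented.
    return post.likes + 2 * post.shares + 3 * post.comments

def rank_feed(posts):
    # The feed surfaces whatever already earns the most engagement,
    # regardless of whether that engagement is approval or outrage.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Local bake sale this weekend", likes=40, shares=1, comments=2),
    Post("THEY are hiding the truth. Share before it's deleted!", likes=25, shares=30, comments=50),
])
print([p.text for p in feed])  # the inflammatory post ranks first
```

Because the ranking rewards reaction rather than accuracy, the loop tightens on its own: the more a conspiratorial post is surfaced, the more engagement it earns, and the more it is surfaced again.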
Content regulation is an important component of social media’s business model. Most advertisers do not want their products displayed next to controversial content, lest it be seen as an endorsement of problematic behavior. Social media sites regulate the content shared by their users so that companies continue to advertise on their platforms. While graphic violence and sexual content are generally not tolerated on mainstream platforms, misinformation is more of a gray area. What qualifies as misinformation is not well defined in the realm of social media. And because most conspiracy theories on social media overlap heavily with right-wing politics, regulation of such content typically draws accusations of politically motivated censorship. Until QAnon became a point of discussion in mainstream media, platforms had no incentive to regulate the conspiracy theory’s growth.
In most cases, platforms outsource the labor of content moderation to users. Users report content they believe violates a platform’s terms of service, and the platform then decides whether to remove it or take additional action, such as banning the account or the user’s IP address. However, as discussed earlier, platforms drive users toward content the algorithm knows they will like. If users are not seeing content they might disagree with, it is unlikely to be reported, even if it violates the platform’s terms of service. After the January 6th attack on the Capitol, Twitter banned more than 70,000 accounts posting QAnon-related content (Ghosh). Why not months earlier, before the November election or the conspiracy theories claiming its illegitimacy? Because social media sites sort users into enclaves of other like-minded users, many Americans were unfamiliar with QAnon until the attack on the Capitol. Given the theory’s relative obscurity, there was no public call for the removal of QAnon from social media, so Twitter, Facebook, and other mainstream platforms had no incentive to remove it. Only after the January 6th Capitol riot were platforms forced to reckon with their complicity in QAnon’s growth.
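The structural weakness of this reporting pipeline inside algorithmic enclaves can be shown with a short, hypothetical simulation. Everything in the sketch below, from the audiences and stance labels to the removal threshold, is invented for illustration; it simply models the claim that when a post circulates only among users who agree with it, the reports that would trigger review never arrive.

```python
# Hypothetical sketch of report-driven moderation. Assume users only flag
# content they disagree with, and a post is removed once enough flags arrive.
REMOVAL_THRESHOLD = 5

def reports_generated(audience, post_stance):
    # Like-minded users do not report the post.
    return sum(1 for user_stance in audience if user_stance != post_stance)

mixed_audience = ["pro", "anti", "anti", "pro", "anti", "anti", "anti", "anti"]
enclave_audience = ["pro"] * 8  # algorithmic sorting shows the post only to believers

for name, audience in [("mixed feed", mixed_audience), ("enclave", enclave_audience)]:
    reports = reports_generated(audience, post_stance="pro")
    print(f"{name}: {reports} reports, removed={reports >= REMOVAL_THRESHOLD}")
# mixed feed: 6 reports, removed=True
# enclave: 0 reports, removed=False
```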
Largely because of the aforementioned accusations of politically motivated censorship, platforms’ approach to content regulation is typically reactive rather than proactive. Chris Eryx describes getting around regulation by replacing words like “mask” with “face diaper,” “election fraud” with “obvious fraud,” and “vaccine” with “jokey-pokey” (Callaghan). Users are clever, and content regulators often have to play catch-up. Regulating QAnon content also has the unintended effect of reinforcing its adherents’ beliefs: if you are fighting against a secret network of Satan-worshiping, child-abusing elites, of course Mark Zuckerberg is going to try to silence you. QAnon activist Jason Frank demonstrates this line of reasoning in his interview with Channel 5’s Andrew Callaghan: “If we weren’t making a difference, if it wasn’t that important, why would they spend so much effort and energy into it?” (Callaghan).
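This cat-and-mouse dynamic is easy to see in miniature. The sketch below is a hypothetical illustration rather than any platform’s actual filter: a naive keyword blocklist, with invented blocked terms and example posts, catches the literal wording but misses the coded substitutions Eryx describes.

```python
# Hypothetical sketch of a naive keyword blocklist of the kind reactive
# moderation relies on; the blocked terms and example posts are invented,
# and the substitutions are the ones Eryx describes.
BLOCKED_TERMS = {"vaccine", "election fraud", "mask"}

def flags_post(text):
    lowered = text.lower()
    return any(term in lowered for term in BLOCKED_TERMS)

original = "Refuse the mask and the vaccine; this is election fraud"
coded = "Refuse the face diaper and the jokey-pokey; this is obvious fraud"

print(flags_post(original))  # True: caught by the blocklist
print(flags_post(coded))     # False: coded language slips through until moderators
                             # learn the euphemisms, by which time new ones exist
```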
These are not coincidences or flukes. Profit motives and the enforcement of content regulation policies are specific, deliberate choices made by the platforms themselves. QAnon and other examples of widespread misinformation are often characterized as aberrations within the typically democratic, liberal space of the Internet. QAnon’s explosive growth on social media challenges this narrative. The affordances and digital infrastructure of mainstream social media platforms encourage the proliferation of theories like QAnon. Platforms must accept responsibility for their role in these crises, and people must demand change.
Conclusion
As Richard Hofstadter demonstrates in his now-famous essay “The Paranoid Style in American Politics,” phenomena like QAnon are not entirely novel. Conspiracy theories have permeated American politics since the nation’s founding (Hofstadter). Some of their proponents have become mainstream political figures. Some have inspired horrific violence. But QAnon is unique in that it is a social media conspiracy theory: it was born on social media and propagated through it (Bodner). It may be impossible to entirely stifle America’s paranoid tendencies or to stop the rise of opportunistic political actors, but I think QAnon’s unique position should inspire more hope than fear.
Social media often appears so sprawling, so ubiquitous, that it can seem ungovernable. Its offline ramifications are often taken for granted as immutable features of 21st-century American society. But social media companies have more control over their platforms than they care to let on, as demonstrated by Twitter’s and Facebook’s bans of a sitting president (Ghosh). Our public spheres have been privatized through social media. But because these platforms can be regulated with relative ease, they are responsive to democratic oversight. Proper governance of social media platforms could have curbed the spread of QAnon and may have prevented the 2021 Capitol attack. Studying QAnon’s growth in reference to American evangelicalism and mainstream social media platforms should help inform how we govern information online in the future.
Works Cited
Bodner, John, et al. COVID-19 Conspiracy Theories: QAnon, 5G, the New World Order and Other Viral Ideas. McFarland & Company, 2020. ProQuest Ebook Central, https://ebookcentral-proquest-com.proxy.library.nyu.edu/lib/nyulibrary-ebooks/detail.action?docID=6396378.
Callaghan, Andrew, et al. “Q Conference.” YouTube, 28 July 2021, https://youtu.be/KYKOLwt8pwo.
Ghosh, Dipayan. “Are We Entering a New Era of Social Media Regulation?” Harvard Business Review, 14 Jan. 2021, https://hbr.org/2021/01/are-we-entering-a-new-era-of-social-media-regulation.
Hannah, Matthew N. “A Conspiracy of Data: QAnon, Social Media, and Information Visualization.” Social Media + Society, July 2021, doi:10.1177/20563051211036064.
Hofstadter, Richard. “The Paranoid Style in American Politics.” Harper’s Magazine, Nov. 1964, https://harpers.org/archive/1964/11/the-paranoid-style-in-american-politics/.
Laughlin, Corrina. “#AmplifyWomen: the emergence of an evangelical feminist public on social media.” Feminist Media Studies, vol. 21, no. 5, 2021, pp. 807-821, doi:10.1080/14680777.2020.1711794.
Packer, Joseph, and Ethan Stoneman. “Where We Produce One, We Produce All: The Platform Conspiracism of QAnon.” Cultural Politics, vol. 17, no. 3, Nov. 2021, pp. 255-278, https://doi-org.proxy.library.nyu.edu/10.1215/17432197-9305338.
Wu, Tim. “Attention Brokers.” NYU Law, http://www.law.nyu.edu/sites/default/files/upload_documents/Tim%20Wu%20-%20Attention%20Brokers.pdf.